
Added LM Studio Support to Fabric #952

Open · wants to merge 207 commits into base: goVersion
Conversation

sosacrazy126

This pull request adds support for LM Studio to Fabric. It includes:

  • A new README.md in the vendors/lmstudio directory with detailed setup and usage instructions.
  • Implementation of the LM Studio functionality within lmstudio.go, enabling integration with local language models.
  • Unit tests for LM Studio in lmstudio_test.go to ensure functionality and reliability.
  • Updates to the existing main README.md file to mention LM Studio support and link to the new documentation.

Please review the changes and let me know if any adjustments are needed.

@JohnnyDisco7

@sosacrazy126 You the man! Thank you!

@poezie

poezie commented Sep 13, 2024

Did this merge work? It doesn't seem like the update went through, or am I missing something?

@eugeis
Collaborator

eugeis commented Sep 13, 2024

@sosacrazy126, thank you for your PR! I have a few points for you to consider:

  1. VendorsManager:
    You've refactored the VendorsManager to work like a factory, but our approach differs. We use a setup procedure where all user-configured vendors are made available after setup is complete. There's no need to modify VendorsManager directly; instead, simply add your new vendor to core/fabric.go like this:
```go
ret.VendorsAll.AddVendors(
	openai.NewClient(),
	azure.NewClient(),
	ollama.NewClient(),
	groc.NewClient(),
	gemini.NewClient(),
	anthropic.NewClient(),
	lm.NewClient(), // Add your new vendor here
)
```

With this approach, we don’t require additional flags like --check-lmstudio, as vendor setup and exception handling are already part of the existing process.

  2. Flags (-m/--model):
    Currently, the configured vendor is selected automatically based on the provided model. There's no need to specify the vendor explicitly. If we want the flexibility to use different configured vendors with the same model name, we could consider introducing a --vendor flag in the future.

  3. LM Studio API:
    From my reading of the LM Studio documentation, it exposes an OpenAI-compatible API, which means we probably don't need much new code for the LM Studio vendor integration. We can likely reuse the same logic we use for Groq, Azure, and others:

```go
func NewClient() (ret *Client) {
	ret = &Client{}
	ret.Client = openai.NewClientCompatible("Groc", "https://api.groq.com/openai/v1", nil)
	return ret
}

type Client struct {
	*openai.Client
}
```

Let me know if this makes sense or if you have further questions!

@sosacrazy126
Author

@eugeis, thank you! I’ll go ahead and implement your suggestion. One question from your overview: Did my implementation seem overly complicated?

@sosacrazy126
Author

> Did this merge work? It doesn't seem like the update went through, or am I missing something?

I'm finalizing the merge and testing.

@xssdoctor changed the base branch from main to goVersion on September 15, 2024.
@eugeis
Collaborator

eugeis commented Sep 16, 2024

Hi, I tried reviewing your updated code, but it's difficult because 90 files have been updated, and it's not clear why. Could you please submit a new pull request that includes only the new feature (LM Studio support)? Thanks!
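One common way to produce such a clean PR is to start a fresh branch from the target base and check out only the feature paths from the old, noisy branch. The sketch below demonstrates the technique end-to-end in a throwaway repository (all branch and file names are illustrative, not the actual PR's):

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q -b main repo && cd repo
git config user.email dev@example.com
git config user.name dev
git commit -q --allow-empty -m "base"

# A messy branch mixing the feature with unrelated churn.
git switch -qc messy
mkdir -p vendors/lmstudio
echo 'package lmstudio' > vendors/lmstudio/lmstudio.go
echo 'unrelated' > noise.txt
git add . && git commit -qm "LM Studio support + unrelated churn"

# Start clean from the base and take only the feature paths.
git switch -q main
git switch -qc lmstudio-clean
git checkout -q messy -- vendors/lmstudio/
git commit -qm "Add LM Studio support"

# The clean branch carries only the feature files.
git ls-tree -r --name-only HEAD
```

Pushing `lmstudio-clean` and opening a PR from it against `goVersion` would then show only the LM Studio changes.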
